An improved subgradient method for constrained nondifferentiable optimization

Authors

  • Sehun Kim
  • Bong-sik Um
Abstract

Polyak's subgradient method for constrained nondifferentiable optimization problems is modified in one respect to improve its computational efficiency: the two consecutive projection operations in Polyak's method are combined into a single projection operation. The resulting algorithm has a convergence property that is strictly stronger than that of the original Polyak method. A computational test shows significant improvement in both the number of iterations and the CPU time.
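To illustrate the general scheme the abstract refers to, the following is a minimal sketch of a projected subgradient iteration with a Polyak-type step size and a single Euclidean projection per iteration. It is not the authors' algorithm: the oracles f and subgrad, the projection project, and the optimal-value estimate f_target are all assumed to be supplied by the user.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, f_target, max_iter=1000):
    """Sketch: projected subgradient step with a Polyak-type step size.

    f        -- objective oracle, f(x) -> float            (assumed given)
    subgrad  -- subgradient oracle, subgrad(x) -> ndarray  (assumed given)
    project  -- Euclidean projection onto the feasible set (assumed easy)
    f_target -- estimate of the optimal value used by the Polyak step
    """
    x = project(np.asarray(x0, dtype=float))
    best_x, best_val = x.copy(), f(x)
    for _ in range(max_iter):
        val = f(x)
        g = subgrad(x)
        gnorm2 = float(g @ g)
        if gnorm2 == 0.0:
            break                                   # x already minimizes f
        step = max(val - f_target, 0.0) / gnorm2    # Polyak step size
        x = project(x - step * g)                   # single projection per step
        if f(x) < best_val:
            best_x, best_val = x.copy(), f(x)
    return best_x, best_val
```

The single project call per iteration is the point the abstract emphasizes: each step touches the feasible set only once rather than twice.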


Related articles

Approximate Primal Solutions and Rate Analysis for Dual Subgradient Methods

In this paper, we study methods for generating approximate primal solutions as a by-product of subgradient methods applied to the Lagrangian dual of a primal convex (possibly nondifferentiable) constrained optimization problem. Our work is motivated by constrained primal problems with a favorable dual problem structure that leads to efficient implementation of dual subgradient methods, such as ...
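A common way to obtain such approximate primal solutions is to average the Lagrangian minimizers produced along the dual subgradient iterations. The sketch below illustrates that idea on an assumed toy problem, minimizing 0.5*||x||^2 subject to A x <= b; it is not the specific method or rate analysis of the paper above.

```python
import numpy as np

def dual_subgradient_primal_avg(A, b, steps=500, alpha=0.01):
    """Dual subgradient ascent for  min 0.5*||x||^2  s.t.  A x <= b,
    with a running average of the Lagrangian minimizers serving as an
    approximate primal solution (a standard recovery scheme, sketched here)."""
    m, n = A.shape
    lam = np.zeros(m)                      # dual multipliers, lam >= 0
    x_avg = np.zeros(n)
    for k in range(1, steps + 1):
        # Lagrangian minimizer: argmin_x 0.5*||x||^2 + lam^T (A x - b) = -A^T lam
        x_k = -A.T @ lam
        # dual subgradient is the constraint residual A x_k - b
        lam = np.maximum(0.0, lam + alpha * (A @ x_k - b))
        # running average of the primal iterates
        x_avg += (x_k - x_avg) / k
    return x_avg, lam
```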


Multiple Cuts in Separating Plane Algorithms

This paper presents an extended version of the separating plane algorithms for subgradient-based finite-dimensional nondifferentiable convex black-box optimization. The extension introduces additional cuts for the epigraph of the conjugate of the objective function, which improve the convergence of the algorithm. The case of affine cuts is considered in more detail, and it is shown that it requires soluti...


Inexact subgradient methods for quasi-convex optimization problems

In this paper, we consider a generic inexact subgradient algorithm to solve a nondifferentiable quasi-convex constrained optimization problem. The inexactness stems from computation errors and noise, which come from practical considerations and applications. Assuming that the computational errors and noise are deterministic and bounded, we study the effect of the inexactness on the subgradient ...
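For context, a minimal sketch of a subgradient iteration driven by an inexact (noisy) oracle is shown below; it uses a normalized direction and a diminishing step size, which is one standard way to cope with bounded oracle errors. The oracle subgrad_noisy and the projection project are assumptions, and this is not the quasi-convex algorithm analyzed in the paper above.

```python
import numpy as np

def noisy_subgradient_method(subgrad_noisy, project, x0, steps=2000, c=1.0):
    """Sketch of a subgradient iteration with an inexact (noisy) oracle.

    subgrad_noisy -- returns an approximate subgradient at x (errors bounded)
    project       -- Euclidean projection onto the feasible set
    Uses a normalized direction and a diminishing step c / sqrt(k).
    """
    x = project(np.asarray(x0, dtype=float))
    for k in range(1, steps + 1):
        g = subgrad_noisy(x)
        norm = np.linalg.norm(g)
        if norm == 0.0:
            break
        x = project(x - (c / np.sqrt(k)) * g / norm)
    return x
```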


Nondifferentiable Optimization via Approximation*

Optimization problems with nondifferentiable cost functionals, particularly minimax problems, have received considerable attention recently since they arise naturally in a variety of contexts. Optimality conditions for such problems have been derived by several authors, while a number of computational methods have been proposed for their solution (the reader is referred to [1] for a fairly compl...



Journal:
  • Oper. Res. Lett.

Volume 14, Issue -

Pages -

Publication year: 1993